Convex optimization methods for dimension reduction and coefficient estimation in multivariate linear regression
Authors
Abstract
In this paper, we study convex optimization methods for computing the nuclear (or, trace) norm regularized least squares estimate in multivariate linear regression. The so-called factor estimation and selection (FES) method, recently proposed by Yuan et al. [25], conducts parameter estimation and factor selection simultaneously and has been shown to enjoy nice properties in both large and finite samples. Computing the estimates, however, can be very challenging in practice because of the high dimensionality and the nuclear norm constraint. In this paper, we explore a variant due to Tseng [23] of Nesterov's smooth method [17, 18] and interior point methods for computing the penalized least squares estimate. The performance of these methods is then compared using a set of randomly generated instances. We show that the variant of Nesterov's smooth method substantially outperforms the interior point method implemented in SDPT3 version 4.0 (beta) [22]. Moreover, the former method is much more memory efficient.
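The estimator discussed above minimizes a least squares loss plus a nuclear norm penalty, 0.5‖Y − XB‖²_F + λ‖B‖_*. As a minimal sketch of this class of problem (not the paper's accelerated smooth variant or its interior point formulation), a plain proximal gradient loop works, since the proximal operator of the nuclear norm is singular value thresholding. The function names here are illustrative, not from the paper:

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the prox of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def nuclear_norm_ls(X, Y, lam, n_iter=500):
    """Proximal gradient for 0.5*||Y - X B||_F^2 + lam*||B||_*."""
    p, q = X.shape[1], Y.shape[1]
    B = np.zeros((p, q))
    # Step size 1/L, where L = ||X||_2^2 is the Lipschitz
    # constant of the gradient of the smooth term.
    t = 1.0 / np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        grad = X.T @ (X @ B - Y)        # gradient of the least squares loss
        B = svt(B - t * grad, t * lam)  # prox step shrinks singular values
    return B
```

Nesterov-type acceleration adds a momentum extrapolation between prox steps, which improves the convergence rate from O(1/k) to O(1/k²); the loop above keeps only the basic step for clarity.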
Similar resources
Dimension reduction and coefficient estimation in multivariate linear regression
We introduce a general formulation for dimension reduction and coefficient estimation in the multivariate linear model. We argue that many of the existing methods that are commonly used in practice can be formulated in this framework and have various restrictions. We continue to propose a new method that is more flexible and more generally applicable. The method proposed can be formulated as a ...
Sparse Reduced Rank Regression With Nonconvex Regularization
In this paper, the estimation problem for sparse reduced rank regression (SRRR) model is considered. The SRRR model is widely used for dimension reduction and variable selection with applications in signal processing, econometrics, etc. The problem is formulated to minimize the least squares loss with a sparsity-inducing penalty considering an orthogonality constraint. Convex sparsity-inducing ...
Moment Based Dimension Reduction for Multivariate Response Regression
Dimension reduction aims to reduce the complexity of a regression without requiring a pre-specified model. In the case of multivariate response regressions, covariance-based estimation methods for the k-th moment based dimension reduction subspaces circumvent slicing and nonparametric estimation so that they are readily applicable to multivariate regression settings. In this article, the covari...
A note on extension of sliced average variance estimation to multivariate regression
Rand Corporation, Pittsburgh, PA 15213 e-mail: [email protected] Abstract: Many sufficient dimension reduction methodologies for univariate regression have been extended to multivariate regression. Sliced average variance estimation (SAVE) has the potential to recover more reductive information, and recent development enables us to test the dimension and predictor effects with distributions comm...
Penalized Bregman Divergence Estimation via Coordinate Descent
Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron, et al. (2004) introduced the LARS algorithm. Recently, the coordinate descent (CD) algorithm was developed by Friedman, et al. (2007) for penalized linear regression and penalized logistic regression and was shown to gain computational superiority. This paper explores...
Journal: Math. Program.
Volume: 131
Issue: -
Pages: -
Publication year: 2012